
    No Polynomial Kernels for Knapsack

    This paper focuses on kernelization algorithms for the fundamental Knapsack problem. A kernelization algorithm (or kernel) is a polynomial-time reduction from a problem onto itself, where the output size is bounded by a function of some problem-specific parameter. Such algorithms provide a theoretical model for data reduction and preprocessing and are central in the area of parameterized complexity. In this way, a kernel for Knapsack for some parameter $k$ reduces any instance of Knapsack to an equivalent instance of size at most $f(k)$ in polynomial time, for some computable function $f(\cdot)$. When $f(k)=k^{O(1)}$, such a reduction is called a polynomial kernel. Our study focuses on two natural parameters for Knapsack: the number of different item weights $w_{\#}$, and the number of different item profits $p_{\#}$. Our main technical contribution is a proof showing that Knapsack does not admit a polynomial kernel for either of these two parameters under standard complexity-theoretic assumptions. Our proof gives an elaborate application of the standard kernelization lower bound framework, and develops along the way novel ideas that should be useful for other problems as well. We complement our lower bounds by showing that Knapsack admits a polynomial kernel for the combined parameter $w_{\#}+p_{\#}$.
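    For context, Knapsack itself is solvable in pseudo-polynomial time by the textbook dynamic program over capacities; the parameters $w_{\#}$ and $p_{\#}$ above count distinct values in the input rather than its size. A minimal sketch of the standard DP (names illustrative, not from the paper):

    ```python
    def knapsack(weights, profits, capacity):
        """Textbook O(n * capacity) DP for 0/1 Knapsack.

        dp[c] = best total profit achievable with total weight at most c.
        """
        dp = [0] * (capacity + 1)
        for w, p in zip(weights, profits):
            # Iterate capacities downwards so each item is used at most once.
            for c in range(capacity, w - 1, -1):
                dp[c] = max(dp[c], dp[c - w] + p)
        return dp[capacity]

    # The parameters studied above count distinct values, e.g.:
    weights, profits = [2, 2, 3, 3, 3], [5, 5, 7, 7, 7]
    w_num = len(set(weights))   # w_# = 2
    p_num = len(set(profits))   # p_# = 2
    ```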

    An asymptotically optimal online algorithm to minimize the total completion time on two multipurpose machines with unit processing times

    Abstract: In the majority of works on online scheduling on multipurpose machines, the objective is to minimize the makespan. We, in contrast, consider the objective of minimizing the total completion time. For this purpose, we analyze an online-list scheduling problem of $n$ jobs with unit processing times on a set of two machines working in parallel. Each job belongs to one of two sets of job types. Jobs belonging to the first set can be processed on either of the two machines, while jobs belonging to the second set can only be processed on the second machine. We present an online algorithm with a competitive ratio of $\rho_{LB}+O(\frac{1}{n})$, where $\rho_{LB}$ is a lower bound on the competitive ratio of any online algorithm and is equal to $1+\left(\frac{-\alpha+\sqrt{4\alpha^3-\alpha^2+2\alpha-1}}{2\alpha^2+1}\right)^2$, where $\alpha=\frac{1}{3}+\frac{1}{6}\left(116-6\sqrt{78}\right)^{1/3}+\frac{\left(58+3\sqrt{78}\right)^{1/3}}{3\cdot 2^{2/3}}\approx 1.918$. This result implies that our online algorithm is asymptotically optimal.
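    The closed-form constant can be checked numerically; the following sketch evaluates the expression as reconstructed from the (garbled) extracted abstract, so treat the exact formula as an assumption, validated only by the stated value $\alpha \approx 1.918$:

    ```python
    # Evaluate alpha = 1/3 + (1/6)(116 - 6*sqrt(78))^(1/3)
    #                + (58 + 3*sqrt(78))^(1/3) / (3 * 2^(2/3))
    alpha = (1/3
             + (1/6) * (116 - 6 * 78**0.5) ** (1/3)
             + (58 + 3 * 78**0.5) ** (1/3) / (3 * 2 ** (2/3)))

    # The corresponding lower bound on the competitive ratio.
    rho_lb = 1 + ((-alpha + (4*alpha**3 - alpha**2 + 2*alpha - 1) ** 0.5)
                  / (2*alpha**2 + 1)) ** 2

    print(round(alpha, 3))   # matches the ~1.918 stated in the abstract
    ```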

    Hardness of Interval Scheduling on Unrelated Machines


    Scheduling Lower Bounds via AND Subset Sum

    Given $N$ instances $(X_1,t_1),\ldots,(X_N,t_N)$ of Subset Sum, the AND Subset Sum problem asks to determine whether all of these instances are yes-instances; that is, whether each set of integers $X_i$ has a subset that sums up to the target integer $t_i$. We prove that this problem cannot be solved in time $\tilde{O}((N \cdot t_{\max})^{1-\epsilon})$, for $t_{\max}=\max_i t_i$ and any $\epsilon > 0$, assuming the $\forall\exists$ Strong Exponential Time Hypothesis ($\forall\exists$-SETH). We then use this result to exclude $\tilde{O}(n+P_{\max} \cdot n^{1-\epsilon})$-time algorithms for several scheduling problems on $n$ jobs with maximum processing time $P_{\max}$, based on $\forall\exists$-SETH. These include classical problems such as $1||\sum w_jU_j$, the problem of minimizing the total weight of tardy jobs on a single machine, and $P2||\sum U_j$, the problem of minimizing the number of tardy jobs on two identical parallel machines. Comment: 14 pages, ICALP'2
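    The problem itself is easy to decide by running the standard Subset Sum dynamic program once per instance; the lower bound above says that, up to polylogarithmic factors, nothing substantially faster in $N \cdot t_{\max}$ is possible. A minimal sketch of that baseline (names illustrative):

    ```python
    def subset_sum(xs, t):
        """Standard bitset DP: bit s of `reach` is set iff some subset of xs sums to s."""
        reach = 1  # only the empty sum 0 is reachable initially
        for x in xs:
            reach |= reach << x
            reach &= (1 << (t + 1)) - 1  # discard sums above the target
        return bool(reach >> t & 1)

    def and_subset_sum(instances):
        """AND Subset Sum: every instance must be a yes-instance."""
        return all(subset_sum(xs, t) for xs, t in instances)

    and_subset_sum([([3, 5, 7], 12), ([1, 2], 3)])   # True: 5+7 = 12 and 1+2 = 3
    and_subset_sum([([3, 5, 7], 12), ([4, 6], 5)])   # False: no subset of {4,6} sums to 5
    ```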

    Faster Minimization of Tardy Processing Time on a Single Machine

    This paper is concerned with the $1||\sum p_jU_j$ problem, the problem of minimizing the total processing time of tardy jobs on a single machine. This is not only a fundamental scheduling problem, but also an important problem from a theoretical point of view, as it generalizes the Subset Sum problem and is closely related to the 0/1-Knapsack problem. The problem is well known to be NP-hard, but only in a weak sense, meaning it admits pseudo-polynomial time algorithms. The fastest known pseudo-polynomial time algorithm for the problem is the famous Lawler and Moore algorithm, which runs in $O(P \cdot n)$ time, where $P$ is the total processing time of all $n$ jobs in the input. This algorithm was developed in the late 1960s and has yet to be improved. In this paper we develop two new algorithms for $1||\sum p_jU_j$, each improving on Lawler and Moore's algorithm in a different scenario. Both algorithms rely on basic primitive operations between sets of integers and vectors of integers for the speedup in their running times. The second algorithm relies on fast polynomial multiplication as its main engine, while for the first algorithm we define a new "skewed" version of $(\max,\min)$-convolution, which is interesting in its own right.
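    For reference, the Lawler and Moore bound can be met with a simple dynamic program: process jobs in non-decreasing due-date order and track which total loads of on-time jobs are achievable. A minimal sketch of that classical DP (not the paper's new algorithms; names illustrative):

    ```python
    def min_tardy_processing_time(jobs):
        """Lawler-Moore-style O(P * n) DP for 1||sum p_j U_j.

        jobs: list of (processing_time, due_date) pairs.
        Returns the minimum total processing time of tardy jobs.
        """
        total = sum(p for p, _ in jobs)
        # reachable = set of total loads t such that some subset of the jobs
        # seen so far can be scheduled entirely on time with total load t.
        reachable = {0}
        for p, d in sorted(jobs, key=lambda job: job[1]):  # EDD order
            # Job (p, d) can be appended on time after load t iff t + p <= d.
            reachable |= {t + p for t in reachable if t + p <= d}
        # Tardy processing time = total load minus the best on-time load.
        return total - max(reachable)
    ```

    On the instance `[(2, 3), (3, 5), (4, 6)]` the best on-time set is the first and third jobs (completing at 2 and 6), so the job with processing time 3 is tardy and the optimum is 3.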

    Scheduling Two Competing Agents When One Agent Has Significantly Fewer Jobs

    We study a scheduling problem where two agents (each equipped with a private set of jobs) compete to perform their respective jobs on a common single machine. Each agent wants to keep the weighted sum of completion times of his jobs below a given (agent-dependent) bound. This problem is known to be NP-hard, even for quite restrictive settings of the problem parameters. We consider parameterized versions of the problem where one of the agents has a small number of jobs (and this small number constitutes the parameter). The problem becomes much more tangible in this case, and we present three positive algorithmic results for it. Our study is complemented by showing that the general problem is NP-complete even when one agent has only a single job.
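    To pin down the feasibility question studied here, it can be specified directly as a search over all interleavings of the two agents' job sets. This exhaustive sketch is only a specification of the problem, exponential in the input and not one of the paper's parameterized algorithms (names illustrative):

    ```python
    from itertools import permutations

    def feasible(jobs_a, jobs_b, bound_a, bound_b):
        """Decide whether some sequence of all jobs on one machine keeps each
        agent's weighted sum of completion times within its bound.

        jobs_a, jobs_b: lists of (processing_time, weight) pairs.
        """
        tagged = ([(p, w, 'A') for p, w in jobs_a]
                  + [(p, w, 'B') for p, w in jobs_b])
        for order in permutations(tagged):
            time, cost = 0, {'A': 0, 'B': 0}
            for p, w, agent in order:
                time += p
                cost[agent] += w * time  # weighted completion time of this job
            if cost['A'] <= bound_a and cost['B'] <= bound_b:
                return True
        return False
    ```

    For example, with agent B holding a single unit job of weight 1, a bound of 1 for B forces B's job to run first, after which agent A's bound is checked against the remaining positions.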